Similar Resources
Minimally Supervised Number Normalization
We propose two models for verbalizing numbers, a key component in speech recognition and synthesis systems. The first model uses an end-to-end recurrent neural network. The second model, drawing inspiration from the linguistics literature, uses finite-state transducers constructed with a minimal amount of training data. While both models achieve near-perfect performance, the latter model can be...
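As a rough illustration of the kind of mapping such a number grammar has to encode (this is not the paper's RNN or FST system), the following Python sketch verbalizes small integers with hand-written rules; the function name verbalize and the coverage limit are assumptions made here for illustration.

```python
# Illustrative sketch of number verbalization (digits -> words).
# NOT the paper's RNN or FST model; it only shows the kind of mapping
# a hand-built or induced number grammar must encode.

ONES = ["zero", "one", "two", "three", "four", "five", "six", "seven",
        "eight", "nine", "ten", "eleven", "twelve", "thirteen", "fourteen",
        "fifteen", "sixteen", "seventeen", "eighteen", "nineteen"]
TENS = ["", "", "twenty", "thirty", "forty", "fifty", "sixty", "seventy",
        "eighty", "ninety"]

def verbalize(n: int) -> str:
    """Verbalize a non-negative integer below one million (sketch only)."""
    if n < 20:
        return ONES[n]
    if n < 100:
        tens, rest = divmod(n, 10)
        return TENS[tens] + ("" if rest == 0 else " " + ONES[rest])
    if n < 1000:
        hundreds, rest = divmod(n, 100)
        head = ONES[hundreds] + " hundred"
        return head if rest == 0 else head + " " + verbalize(rest)
    if n < 1_000_000:
        thousands, rest = divmod(n, 1000)
        head = verbalize(thousands) + " thousand"
        return head if rest == 0 else head + " " + verbalize(rest)
    raise ValueError("sketch only covers numbers below one million")

if __name__ == "__main__":
    print(verbalize(97))    # ninety seven
    print(verbalize(1234))  # one thousand two hundred thirty four
```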
Supervised normalization of microarrays
MOTIVATION: A major challenge in utilizing microarray technologies to measure nucleic acid abundances is 'normalization', the goal of which is to separate biologically meaningful signal from other confounding sources of signal, often due to unavoidable technical factors. It is intuitively clear that true biological signal and confounding factors need to be simultaneously considered when performi...
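A hedged sketch of the general idea behind supervised normalization, not the specific algorithm of this paper: model each probe's intensities with the biological factor and a known technical factor in one design matrix, then subtract only the fitted technical component. The function normalize_probe and the toy covariates are assumptions made here for illustration.

```python
# Sketch of supervised normalization (not this paper's algorithm):
# fit biological and technical effects jointly, then remove only the
# fitted technical (confounding) component from the raw intensities.
import numpy as np

def normalize_probe(y, bio, batch):
    """y: raw intensities for one probe; bio, batch: per-array covariates."""
    X = np.column_stack([np.ones_like(y), bio, batch])  # joint design matrix
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)         # least-squares fit
    return y - batch * beta[2]                           # subtract batch effect only

# Toy example: 6 arrays, two biological groups, two processing batches.
bio   = np.array([0, 0, 0, 1, 1, 1], dtype=float)
batch = np.array([0, 1, 0, 1, 0, 1], dtype=float)
rng = np.random.default_rng(0)
y = 5.0 + 2.0 * bio + 1.5 * batch + 0.1 * rng.standard_normal(6)
print(normalize_probe(y, bio, batch))  # batch shift removed, group difference kept
```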
Algorithms for minimally supervised learning
The past few decades have brought substantial progress in the mathematical analysis of supervised learning. This is a paradigm in which a learner is provided with a data set consisting of points x and their labels (or response values) y, and is tasked with finding a suitable classifier (or regressor) that maps x → y. There are many popular types of classifiers—decision trees, linear separators,...
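As a concrete toy instance of this x → y setup (assuming scikit-learn is available; the synthetic data and the choice of a depth-3 decision tree are arbitrary, for illustration only):

```python
# Toy sketch of the supervised-learning setup described above:
# given labelled points (x, y), fit a classifier that maps x -> y.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 2))    # data points x
y = (X[:, 0] + X[:, 1] > 0).astype(int)      # labels y from a simple rule

clf = DecisionTreeClassifier(max_depth=3).fit(X, y)   # the learned classifier
print(clf.predict([[0.5, 0.5], [-0.7, -0.2]]))        # typically prints [1 0]
```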
Minimally Supervised Event Causality Identification
This paper develops a minimally supervised approach, based on focused distributional similarity methods and discourse connectives, for identifying causality relations between events in context. While it has been shown that distributional similarity can help identify causality, we observe that discourse connectives and the particular discourse relation they evoke in context provide additio...
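A naive sketch of how these two signals might be combined; this is not the paper's model, and the connective list, the context-count vectors, and the additive scoring are assumptions made here for illustration.

```python
# Hedged sketch (not this paper's model): combine a distributional-similarity
# score between two event words with a check for an intervening causal
# discourse connective, as two weak signals for causality in context.
from collections import Counter
import math

CAUSAL_CONNECTIVES = {"because", "therefore", "thus", "hence", "so"}

def context_vector(word, corpus, window=2):
    """Count words co-occurring with `word` within a small window."""
    vec = Counter()
    for sent in corpus:
        toks = sent.lower().split()
        for i, t in enumerate(toks):
            if t == word:
                for j in range(max(0, i - window), min(len(toks), i + window + 1)):
                    if j != i:
                        vec[toks[j]] += 1
    return vec

def cosine(u, v):
    dot = sum(u[k] * v[k] for k in u if k in v)
    nu = math.sqrt(sum(c * c for c in u.values()))
    nv = math.sqrt(sum(c * c for c in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def causality_score(e1, e2, sentence, corpus):
    sim = cosine(context_vector(e1, corpus), context_vector(e2, corpus))
    has_connective = any(c in sentence.lower().split() for c in CAUSAL_CONNECTIVES)
    return sim + (1.0 if has_connective else 0.0)   # naive additive combination

corpus = ["the storm caused flooding", "heavy rain caused flooding",
          "the storm brought heavy rain"]
print(causality_score("storm", "flooding",
                      "flooding followed because of the storm", corpus))
```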
Minimally-Supervised Morphological Segmentation using Adaptor Grammars
This paper explores the use of Adaptor Grammars, a nonparametric Bayesian modelling framework, for minimally supervised morphological segmentation. We compare three training methods: unsupervised training, semi-supervised training, and a novel model selection method. In the model selection method, we train unsupervised Adaptor Grammars using an over-articulated metagrammar, then use a small labe...
Journal
Journal title: Transactions of the Association for Computational Linguistics
Year: 2016
ISSN: 2307-387X
DOI: 10.1162/tacl_a_00114